Boosting as a Product of Experts
Authors
Abstract
In this paper, we derive a novel probabilistic model of boosting as a Product of Experts. We re-derive the boosting algorithm as a greedy incremental model selection procedure which ensures that adding new experts to the ensemble does not decrease the likelihood of the data. These learning rules lead to a generic boosting algorithm, POEBoost, which turns out to be similar to the AdaBoost algorithm under certain assumptions on the expert probabilities. The paper then extends POEBoost to POEBoost.CS, which handles hypotheses that produce probabilistic predictions. This new algorithm is shown to have better generalization performance than other state-of-the-art algorithms.
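For context, the standard AdaBoost procedure that the abstract says POEBoost resembles can be sketched as follows. This is a minimal illustration of plain AdaBoost with decision stumps, not the paper's POEBoost algorithm; the function names and the stump-based weak learner are choices made here for the sketch.

```python
import numpy as np

def adaboost(X, y, n_rounds=10):
    """Minimal AdaBoost sketch with axis-aligned decision stumps.

    Illustrates the greedy incremental ensemble construction the
    abstract refers to. Labels y must be in {-1, +1}.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)            # example weights, initially uniform
    ensemble = []                      # list of (alpha, feature, threshold, sign)
    for _ in range(n_rounds):
        best = None
        # greedily pick the stump with the lowest weighted error
        for j in range(X.shape[1]):
            for thr in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = sign * np.where(X[:, j] <= thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, sign)
        err, j, thr, sign = best
        err = max(err, 1e-12)                    # avoid division by zero
        alpha = 0.5 * np.log((1 - err) / err)    # weight of the new expert
        pred = sign * np.where(X[:, j] <= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)           # upweight misclassified examples
        w /= w.sum()
        ensemble.append((alpha, j, thr, sign))
    return ensemble

def predict(ensemble, X):
    """Weighted vote of all experts in the ensemble."""
    score = np.zeros(len(X))
    for alpha, j, thr, sign in ensemble:
        score += alpha * sign * np.where(X[:, j] <= thr, 1, -1)
    return np.sign(score)
```

Each round adds one expert whose weight depends on its weighted training error, which is the incremental-selection behavior the paper re-derives probabilistically as a Product of Experts.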
Similar Works
Explaining the initial level of “one village, one product strategy” in the development of the local economy Case: Nahrmian Rural District of Shazand County
Introduction: By expanding markets and diversifying products based on a desired local product, the "One Village, One Product" strategy takes advantage of the region's potential in the competitive market. Utilizing the "One Village, One Product" brand, as a global product, demonstrates pride in local culture as well as an effective economic strategy, which is intended for the development of...
Boosted Pre-loaded Mixture of Experts for low-resolution face recognition
A modified version of Boosted Mixture of Experts (BME) for low-resolution face recognition is presented in this paper. Most methods developed for low-resolution face recognition have focused on improving the resolution of face images and/or special feature extraction methods that can deal effectively with the low-resolution problem. However, we focused on the classification step of face recogniti...
Outlier Detection by Boosting Regression Trees
A procedure for detecting outliers in regression problems is proposed. It is based on information provided by boosting regression trees. The key idea is to select the most frequently resampled observation along the boosting iterations and reiterate after removing it. The selection criterion is based on Tchebychev’s inequality applied to the maximum over the boosting iterations of ...
Ensemble Methods for Phoneme Classification
In this paper we investigate a number of ensemble methods for improving the performance of phoneme classification for use in a speech recognition system. We discuss boosting and mixtures of experts, both in isolation and in combination. We present results on an isolated word database. The results show that principled ensemble methods such as boosting and mixtures provide superior performance to ...
Localized Boosting
We introduce and analyze LocBoost, a new boosting algorithm, which leads to the incremental construction of a mixture of experts type architecture. We provide upper bounds on the expected loss of such models in terms of the smoothness properties of the gating functions appearing in the mixture of experts model. Furthermore, an incremental algorithm is proposed for the construction of the classi...